Section: New Results

Statistical analysis of spike trains

Modern advances in neurophysiology techniques, such as two-photon imaging of calcium signals or micro-electrode array (MEA) electrophysiology, have made it possible to simultaneously observe the activity of large assemblies of neurons. Such experimental recordings provide a great opportunity to unravel the underlying interactions of neural assemblies and to understand how neural populations dynamically encode information. The goal of the present project is to provide the neuroscience community with statistical methods and numerical tools for analysing the statistics of action potentials (spike trains) obtained from MEA recordings. Our work is grounded, on the one hand, on theoretical results on Gibbs distributions in neural networks and, on the other hand, on a C/C++ library of algorithms developed jointly with the CORTEX INRIA team, freely available at http://enas.gforge.inria.fr/ . We collaborate with several labs specialized in MEA recordings from the retina: Centro Interdisciplinario de Neurociencia de Valparaiso, Universidad de Valparaiso, Chile, http://www.cinv.cl/ ; Department of Molecular Biology and Princeton Neuroscience Institute, Princeton University, USA, http://www.princeton.edu/neuroscience/ ; Institut de la Vision, Paris, http://www.institut-vision.org/ .

A discrete time neural network model with spiking neurons. Dynamics with noise.

Participant : Bruno Cessac [correspondent] .

We provide rigorous and exact results characterizing the statistics of spike trains in a network of leaky Integrate-and-Fire neurons, where time is discrete and where neurons are subject to noise, without restriction on the synaptic weights. We show the existence and uniqueness of an invariant measure of Gibbs type and discuss its properties. We also discuss Markovian approximations and relate them to the approaches currently used in computational neuroscience to analyse experimental spike train statistics. This work has appeared in the Journal of Mathematical Biology [17].
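
For orientation, a minimal sketch of this type of discrete-time dynamics (with illustrative notation; the precise model and hypotheses are those of [17]) is

  V_i(t+1) = \gamma V_i(t) (1 - Z[V_i(t)]) + \sum_j W_{ij} Z[V_j(t)] + I_i(t) + \sigma_B \xi_i(t),   Z[V] = 1_{V \ge \theta},

where \gamma \in [0,1) is the leak factor, W_{ij} are the (unconstrained) synaptic weights, \xi_i(t) is a Gaussian noise, and Z[V_i(t)] indicates whether neuron i fires at time t, in which case its potential is reset. The Gibbs-type invariant measure characterizes the statistics of the resulting binary spike patterns.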

Statistics of spike trains in conductance-based neural networks: Rigorous results

Participant : Bruno Cessac [correspondent] .

We consider a conductance-based neural network inspired by the generalized Integrate-and-Fire model introduced by Rudolph and Destexhe in 1996. We show the existence and uniqueness of a Gibbs distribution characterizing spike train statistics. The corresponding Gibbs potential is explicitly computed. These results hold in the presence of a time-dependent stimulus and therefore apply to non-stationary dynamics. This establishes a rigorous ground for current investigations attempting to characterize real spike train data with Gibbs distributions, such as Ising-like distributions obtained from the maximal entropy principle. This work has appeared in the Journal of Mathematical Neuroscience [18].
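
As a point of reference (not part of the results themselves), the Ising-like maximal entropy model commonly fitted to spike patterns \omega = (\omega_1, ..., \omega_N) \in \{0,1\}^N has the form

  P(\omega) = \frac{1}{Z} \exp\Big( \sum_i h_i \omega_i + \sum_{i<j} J_{ij} \omega_i \omega_j \Big),

where the fields h_i and the couplings J_{ij} are adjusted so that the model reproduces the measured firing rates and pairwise correlations. The results above identify conditions under which the exact spike train statistics of the network is indeed a Gibbs distribution, with an explicit (and, in general, history-dependent) potential rather than this purely spatial form.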

Spike Train Statistics from Empirical Facts to Theory: The Case of the Retina.

Participants : Bruno Cessac [correspondent] , Adrian Palacios [Centro de Neurociencia, Valparaiso, Chile] .

This work focuses on methods from statistical physics and probability theory allowing the analysis of spike trains in neural networks. Taking the retina as an example, we present recent work attempting to understand how retinal ganglion cells encode the information transmitted to the visual cortex via the optic nerve, by analyzing their spike train statistics. We compare the maximal entropy models used in the literature on retinal spike train analysis to rigorous results establishing the exact form of spike train statistics in conductance-based Integrate-and-Fire neural networks. This work has been submitted to “Mathematical Problems in Computational Biology and Biomedicine”, Springer [54].
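
Schematically, the maximal entropy principle underlying these models selects, among all distributions P reproducing the empirical averages of a chosen set of observables m_l (firing rates, correlations, ...), the one of maximal entropy:

  \max_P \; S[P] = -\sum_{\omega} P(\omega) \log P(\omega)   subject to   E_P[m_l] = \langle m_l \rangle_{data},  l = 1, ..., L,

whose solution is the Gibbs form P(\omega) \propto \exp( \sum_l \lambda_l m_l(\omega) ), the Lagrange multipliers \lambda_l being fixed by the constraints. Restricting the observables to single spikes and synchronous pairs gives the Ising-like model above; including time-lagged observables gives the spatio-temporal models discussed next.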

Gibbs distribution analysis of temporal correlations structure in retina ganglion cells

Participants : Michael Berry II [Department of Molecular Biology, Princeton University, USA] , Bruno Cessac [correspondent] , Olivier Marre, Adrian Palacios [Centro de Neurociencia, Valparaiso, Chile] , Juan-Carlos Vasquez.

We present a method to estimate Gibbs distributions with spatio-temporal constraints on spike train statistics. We apply this method to spike trains recorded from ganglion cells of the salamander retina in response to natural movies. Our analysis, restricted to a few neurons, describes the statistics of spatio-temporal spike patterns more accurately than pairwise synchronization models (Ising) or 1-time-step Markov models (Marre et al., 2009), and emphasizes the role of higher-order spatio-temporal interactions. This work has been accepted in the Journal of Physiology, Paris [28] (in press).
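
The first step of such an estimation is purely empirical: from the binned spike raster one computes the average values of the chosen spatio-temporal observables (firing rates, time-lagged pairwise correlations, ...), which the Gibbs potential is then fitted to reproduce. The following C++ fragment is only an illustrative, self-contained sketch of this first step, with our own data layout and function names; it is not the API of the library mentioned above.

// Illustrative sketch (not the library API): empirical averages of
// spatio-temporal observables from a binary spike raster.
// raster[n][t] == 1 if neuron n emits a spike in time bin t.
#include <cstddef>
#include <iostream>
#include <vector>

// Empirical firing rate of neuron i: average of omega_i(t) over time bins.
double firing_rate(const std::vector<std::vector<int>>& raster, std::size_t i) {
    double sum = 0.0;
    for (int s : raster[i]) sum += s;
    return sum / raster[i].size();
}

// Empirical time-lagged pairwise correlation < omega_i(t) omega_j(t + tau) >,
// the kind of spatio-temporal constraint entering the Gibbs potential.
double lagged_correlation(const std::vector<std::vector<int>>& raster,
                          std::size_t i, std::size_t j, std::size_t tau) {
    const std::size_t T = raster[i].size();
    double sum = 0.0;
    for (std::size_t t = 0; t + tau < T; ++t)
        sum += raster[i][t] * raster[j][t + tau];
    return sum / (T - tau);
}

int main() {
    // Toy raster: 2 neurons, 8 time bins.
    std::vector<std::vector<int>> raster = {
        {1, 0, 1, 0, 0, 1, 0, 1},
        {0, 1, 0, 1, 0, 0, 1, 0}
    };
    std::cout << "rate(0)         = " << firing_rate(raster, 0) << "\n";
    std::cout << "corr(0,1,tau=1) = " << lagged_correlation(raster, 0, 1, 1) << "\n";
}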

A Markovian event-based framework for stochastic spiking neural networks

Participants : Olivier Faugeras, Jonathan Touboul.

In spiking neural networks, information is conveyed by spike times, which depend on the intrinsic dynamics of each neuron, the input it receives and the connections between neurons. In this article we study the Markovian nature of the sequence of spike times in stochastic neural networks, and in particular the possibility of deducing the next spike time from past spike times, thereby describing the network activity through spike times alone, regardless of the membrane potential process. To study this question in a rigorous manner, we introduce and study an event-based description of networks of noisy integrate-and-fire neurons, i.e. a description based on the computation of spike times. We show that the firing times of the neurons in the network constitute a Markov chain whose transition probability is related to the probability distribution of the interspike intervals of the neurons in the network. In the cases where the Markovian model can be developed, the transition probability is derived explicitly for classical neural network models such as the linear integrate-and-fire neuron with excitatory and inhibitory interactions, for different types of synapses, possibly featuring noisy synaptic integration, transmission delays, and absolute and relative refractory periods. This covers most of the cases that have been investigated in the event-based description of spiking deterministic neural networks.

This work has appeared in the Journal of Computational Neuroscience [26].
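
In schematic form (with our own notation, for a single noisy integrate-and-fire neuron with stationary input rather than the general networks treated in [26]): once the membrane potential is reset at a spike time t_n, the next spike time t_{n+1} depends on the past only through t_n, so the sequence (t_n) is a Markov chain with transition kernel

  P( t_{n+1} \in [t, t+dt] \mid t_n ) = p_{ISI}(t - t_n) \, dt,

where p_{ISI} is the interspike-interval density, i.e. the first-passage-time density of the membrane potential to the firing threshold. The contribution of [26] is to identify the network settings (synapse models, delays, refractory periods, interactions) in which an analogous event-based Markovian description holds, and to derive the corresponding transition probabilities explicitly.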